Mixtral 8x7B Instruct V0.1 Offloading Demo
License: MIT
Mixtral is a multilingual text generation model built on a sparse Mixture of Experts (MoE) architecture, supporting English, French, Italian, German, and Spanish. This demo runs the instruct-tuned 8x7B variant with weight offloading, so it can be used on hardware that cannot hold the full model in GPU memory.
Tags: Large Language Model · Transformers · Multiple Languages
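The sketch below shows one way to load and query Mixtral 8x7B Instruct v0.1 with the Hugging Face Transformers library, using `device_map="auto"` to offload weights that do not fit on the GPU to CPU RAM or disk. It is a minimal illustration of generic Transformers offloading, not necessarily the demo's own loading code; the `offload` folder name and the example prompt are placeholders.

```python
# Minimal sketch: load Mixtral 8x7B Instruct v0.1 with automatic weight
# offloading via Transformers/Accelerate, then generate a short completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision to reduce the memory footprint
    device_map="auto",           # spread layers across GPU, CPU, and disk as needed
    offload_folder="offload",    # illustrative local folder for weights offloaded to disk
)

# Format a single-turn instruction with the chat template and generate.
messages = [{"role": "user", "content": "Explain Mixture of Experts in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With `device_map="auto"`, layers that do not fit on the GPU are kept in CPU RAM or on disk and moved to the GPU only when needed, which trades generation speed for a much smaller GPU memory requirement.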